# Mixture of Experts Fine-tuning
Pantheon Proto RP 1.8 30B A3B
Apache-2.0
A Mixture of Experts (MoE) role-playing model fine-tuned from Qwen3-30B-A3B-Base (30B total parameters, ~3B active per token), supporting precise multi-character role-play and diverse interactive experiences.
Large Language Model · English
Author: Gryphe
© 2025 AIbase